Subspace acceleration for the Crawford number and related eigenvalue optimization problems

Authors

  • Daniel Kressner
  • Ding Lu
  • Bart Vandereycken
Abstract

This paper is concerned with subspace acceleration techniques for computing the Crawford number, that is, the distance between zero and the numerical range of a matrix A. Our approach is based on an eigenvalue optimization characterization of the Crawford number. We establish local convergence of order 1 + √2 ≈ 2.4 for an existing subspace method applied to this and other eigenvalue optimization problems involving a Hermitian matrix that depends analytically on one parameter. For the particular case of the Crawford number, we show that the relevant part of the objective function is strongly concave. In turn, this enables us to develop a subspace method that uses only three-dimensional subspaces but still achieves global convergence and a local convergence that is at least quadratic. A number of numerical experiments confirm our theoretical results and reveal that the established convergence orders appear to be tight.
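To make the eigenvalue optimization characterization referred to above concrete: writing A = H + iK with Hermitian H and K, the distance from zero to the numerical range W(A) equals max(0, max over θ of λ_min(cos(θ)H + sin(θ)K)). The Python sketch below evaluates this by a brute-force grid search over θ; the grid search and the name crawford_number_grid are illustrative choices, not the subspace method developed in the paper.

import numpy as np

def crawford_number_grid(A, n_theta=3600):
    """Estimate the distance from 0 to the numerical range W(A) via the
    characterization max over theta of lambda_min(cos(theta)*H + sin(theta)*K),
    where A = H + iK with Hermitian H, K. Brute-force grid, for illustration only."""
    H = (A + A.conj().T) / 2
    K = (A - A.conj().T) / 2j
    best = -np.inf
    for theta in np.linspace(0.0, 2 * np.pi, n_theta, endpoint=False):
        M = np.cos(theta) * H + np.sin(theta) * K
        best = max(best, np.linalg.eigvalsh(M)[0])  # smallest eigenvalue of Hermitian M
    # A non-positive maximum means 0 lies inside W(A), so the distance is 0.
    return max(best, 0.0)

For a normal matrix A, W(A) is the convex hull of the eigenvalues, which gives an easy way to check the output on small examples.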

Similar resources

A New Inexact Inverse Subspace Iteration for Generalized Eigenvalue Problems

In this paper, we present an inexact inverse subspace iteration method for computing a few eigenpairs of the generalized eigenvalue problem Ax = λBx [Q. Ye and P. Zhang, Inexact inverse subspace iteration for generalized eigenvalue problems, Linear Algebra and its Applications, 434 (2011) 1697-1715]. In particular, the linear convergence property of the inverse subspace iteration is preserved.

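For reference, the following sketch shows plain (exact-solve) inverse subspace iteration with a Rayleigh-Ritz extraction for Ax = λBx; the cited paper's point is that the inner solves AY = BX may be carried out only approximately (inexactly) while linear convergence is preserved. The function name, the random starting basis, and the exact inner solves are simplifying assumptions.

import numpy as np

def inverse_subspace_iteration(A, B, p, n_iter=100, seed=0):
    """Plain inverse subspace iteration for the p eigenvalues of A x = lambda B x
    of smallest magnitude (A assumed nonsingular). Inner systems are solved exactly
    here; the inexact variant solves them only approximately."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    X, _ = np.linalg.qr(rng.standard_normal((n, p)))
    for _ in range(n_iter):
        Y = np.linalg.solve(A, B @ X)  # one iteration: X <- orth(A^{-1} B X)
        X, _ = np.linalg.qr(Y)
    # Rayleigh-Ritz: eigenvalues of (X^H B X)^{-1} (X^H A X) approximate the wanted ones
    Ar = X.conj().T @ A @ X
    Br = X.conj().T @ B @ X
    return np.linalg.eigvals(np.linalg.solve(Br, Ar)), X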

A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems

In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that, based on eigenvalue analysis, the search direction always satisfies the sufficient descent condition independently of the line search method. The global...

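For orientation only, the sketch below implements the classical Liu-Storey (LS) direction with a backtracking Armijo line search and a steepest-descent restart whenever the computed direction is not a descent direction. It is not the paper's extended three-term methods, whose update formulas are not reproduced in the excerpt above; all constants and names are assumptions.

import numpy as np

def ls_conjugate_gradient(f, grad, x0, max_iter=500, tol=1e-8):
    """Baseline Liu-Storey (LS) nonlinear conjugate gradient with Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                       # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t, c1 = 1.0, 1e-4
        while f(x + t * d) > f(x) + c1 * t * (g @ d) and t > 1e-14:
            t *= 0.5                             # backtracking Armijo line search
        x_new = x + t * d
        g_new = grad(x_new)
        beta = g_new @ (g_new - g) / (-(d @ g))  # Liu-Storey parameter
        d = -g_new + beta * d
        if d @ g_new >= 0:                       # fall back to steepest descent if d
            d = -g_new                           # is not a descent direction
        x, g = x_new, g_new
    return x

The fallback plays the role that the sufficient descent condition plays in the abstract above: the method never continues along a direction in which the objective cannot locally decrease.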

Some new restart vectors for explicitly restarted Arnoldi method

The explicitly restarted Arnoldi method (ERAM) can be used to find some eigenvalues of large and sparse matrices. However, it has been shown that even this method may fail to converge. In this paper, we present two new methods to accelerate the convergence of the ERAM algorithm. In these methods, we apply two strategies for updating the initial vector in each restart cycle. The implementation of th...

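To fix ideas, a bare-bones explicitly restarted Arnoldi loop is sketched below, restarting each cycle with an unweighted sum of the wanted Ritz vectors; the cited paper is precisely about better choices of this restart vector, which are not shown here. The absence of breakdown handling and deflation is a deliberate simplification.

import numpy as np

def arnoldi(A, v, m):
    """m-step Arnoldi factorization A V_m = V_{m+1} H (no breakdown handling)."""
    n = A.shape[0]
    V = np.zeros((n, m + 1), dtype=complex)
    H = np.zeros((m + 1, m), dtype=complex)
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):             # modified Gram-Schmidt orthogonalization
            H[i, j] = np.vdot(V[:, i], w)
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / H[j + 1, j]
    return V, H

def eram(A, k, m=20, n_restarts=30, seed=0):
    """Explicitly restarted Arnoldi for the k eigenvalues of largest modulus."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0]).astype(complex)
    for _ in range(n_restarts):
        V, H = arnoldi(A, v, m)
        theta, Y = np.linalg.eig(H[:m, :m])  # Ritz values/vectors of the small matrix
        idx = np.argsort(-np.abs(theta))[:k]
        ritz = V[:, :m] @ Y[:, idx]          # Ritz vectors in the full space
        v = ritz.sum(axis=1)                 # updated initial vector for the next cycle
    return theta[idx], ritz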

Estimating the Number of Wideband Radio Sources

In this paper, a new approach for estimating the number of wideband sources is proposed, based on the RSS or ISM algorithms. Numerical results show that the MDL-based and EIT-based proposed algorithms have a much better detection performance than the EGM and AIC cases for small differences between the incident angles of the sources. In addition, under similar conditions, the RSS algorithm offers hi...


A Symplectic Perspective on Constrained Eigenvalue Problems

The Maslov index is a powerful tool for computing spectra of self-adjoint, elliptic boundary value problems. This is done by counting intersections of a fixed Lagrangian subspace, which designates the boundary condition, with the set of Cauchy data for the differential operator. We apply this methodology to constrained eigenvalue problems, in which the operator is restricted to a (not necessaril...



Publication date: 2017